A Sparse Multi-Scale Algorithm for Dense Optimal Transport
Discrete optimal transport solvers do not scale well on dense large problems
since they do not explicitly exploit the geometric structure of the cost
function. In analogy to continuous optimal transport we provide a framework to
verify global optimality of a discrete transport plan locally. This allows
construction of an algorithm to solve large dense problems by considering a
sequence of sparse problems instead. The algorithm lends itself to being
combined with a hierarchical multi-scale scheme. Any existing discrete solver
can be used as an internal black box. Several cost functions, including the noisy
squared Euclidean distance, are explicitly detailed. We observe a significant
reduction of run-time and memory requirements.
Comment: Published "online first" in Journal of Mathematical Imaging and
Vision, see DOI
Dynamic Models of Wasserstein-1-Type Unbalanced Transport
We consider a class of convex optimization problems modelling temporal mass
transport and mass change between two given mass distributions (the so-called
dynamic formulation of unbalanced transport), where we focus on those models
for which transport costs are proportional to transport distance. For those
models we derive an equivalent, computationally more efficient static
formulation, we perform a detailed analysis of the model optimizers and the
associated optimal mass change and transport, and we examine which static
models are generated by a corresponding equivalent dynamic one. Alongside, we
discuss thoroughly how the employed model formulations relate to other
formulations found in the literature.
Comment: to appear in ESAIM: Control, Optimisation and Calculus of Variations
Convergence of Entropic Schemes for Optimal Transport and Gradient Flows
Replacing positivity constraints by an entropy barrier is popular to
approximate solutions of linear programs. In the special case of the optimal
transport problem, this technique dates back to the early work of
Schrödinger. This approach has recently been used successfully to solve
optimal transport related problems in several applied fields such as imaging
sciences, machine learning and social sciences. The main reason for this
success is that, in contrast to linear programming solvers, the resulting
algorithms are highly parallelizable and take advantage of the geometry of the
computational grid (e.g. an image or a triangulated mesh). The first
contribution of this article is the proof of the Γ-convergence of the
entropic regularized optimal transport problem towards the Monge-Kantorovich
problem for the squared Euclidean norm cost function. This implies in
particular the convergence of the optimal entropic regularized transport plan
towards an optimal transport plan as the entropy vanishes. Optimal transport
distances are also useful to define gradient flows as a limit of implicit Euler
steps according to the transportation distance. Our second contribution is a
proof that implicit steps according to the entropic regularized distance
converge towards the original gradient flow when both the step size and the
entropic penalty vanish (in some controlled way).
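The entropy-barrier scheme this abstract studies can be illustrated with a minimal Sinkhorn-type iteration on the Gibbs kernel K = exp(-C/ε): alternating diagonal scalings match the two marginals, and as ε shrinks the resulting plan approaches an optimal transport plan. The grid, distributions, and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps, n_iter=1000):
    """Entropic regularized OT: approximately minimize <P, C> + eps * KL(P | mu x nu)
    by alternately rescaling rows and columns of the Gibbs kernel K = exp(-C/eps)."""
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)   # scale columns to match the target marginal nu
        u = mu / (K @ v)     # scale rows to match the source marginal mu
    return u[:, None] * K * v[None, :]  # transport plan P = diag(u) K diag(v)

# toy example: two smoothed bumps on a 1D grid, squared Euclidean cost
x = np.linspace(0.0, 1.0, 50)
mu = np.exp(-(x - 0.2) ** 2 / 0.01); mu /= mu.sum()
nu = np.exp(-(x - 0.7) ** 2 / 0.01); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2
P = sinkhorn(mu, nu, C, eps=0.1)
# the marginals of P approximately recover mu and nu
```

The iteration is just elementwise divisions and matrix-vector products, which is why, as the abstract notes, such schemes parallelize well on regular computational grids.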